Principal component analysis using QR decomposition
Authors
Abstract
In this paper we present a QR decomposition based principal component analysis (PCA) method. Like the singular value decomposition (SVD) based PCA method, this method is numerically stable. We have carried out an analytical comparison as well as a numerical comparison (in Matlab) to investigate the performance of our method in terms of computational complexity. The computational complexity of SVD based PCA is around 14dn flops (where d is the dimensionality of the feature space and n is the number of training feature vectors), whereas the computational complexity of QR based PCA is around 2dn + 2dth flops (where t is the rank of the data covariance matrix and h is the dimensionality of the reduced feature space). It is observed that QR based PCA is more efficient in terms of computational complexity.
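The idea sketched in the abstract — obtaining the principal directions from a QR factorization of the centered data and the SVD of the small triangular factor, rather than from a full SVD of the data matrix — can be illustrated as follows. This is a minimal sketch assuming one common formulation of QR-based PCA; the function names and factorization order are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def pca_qr(X, h):
    # X: (n, d) data matrix with n samples of dimension d; h: reduced dimension.
    # Sketch of QR-based PCA: factorize the centered data transpose as Q R,
    # then diagonalize only the small factor R instead of the full data matrix.
    Xc = X - X.mean(axis=0)            # center the data
    Q, R = np.linalg.qr(Xc.T)          # Xc.T = Q R, with Q having orthonormal columns
    U, s, _ = np.linalg.svd(R, full_matrices=False)  # SVD of the small factor
    W = Q @ U[:, :h]                   # (d, h) projection matrix: top-h directions
    return Xc @ W                      # reduced (n, h) feature vectors

def pca_svd(X, h):
    # Reference SVD-based PCA for comparison.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:h].T
```

Both routines project onto the same principal subspace (columns may differ in sign, as principal directions are only defined up to sign); the QR route works on the small factor R, which is where the complexity saving described in the abstract comes from.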
Similar articles
Exploratory factor and principal component analyses: some new aspects
Exploratory Factor Analysis (EFA) and Principal Component Analysis (PCA) are popular techniques for simplifying presentation of, and investigating structure of, an (n×p) data matrix. However, these fundamentally different techniques are frequently confused, and the differences between them are obscured, because they give similar results in some practical cases. We therefore investigate conditio...
Streaming Sparse Principal Component Analysis
1. Preliminaries. Theorem A-1 (Theorem 3.1, Chang, 2012). Let A ∈ Rm×n be of full column rank with QR factorization A = QR, let ∆A be a perturbation in A, and let A + ∆A = (Q + ∆Q)(R + ∆R) be the QR factorization of A + ∆A. Let PA and PA⊥ be the orthogonal projectors onto the range of A and the orthogonal complement of the range of A, respectively. Let Q⊥ be an orthonormal matrix such that the matrix [Q, Q⊥...
Holographic Blind Watermarking Algorithm of Three-Dimensional Mesh Model Based on QR Decomposition
This paper proposes a holographic blind watermarking algorithm for three-dimensional mesh models based on QR decomposition. First, the three-dimensional model is pre-processed by moving the model center to the coordinate origin and performing PCA. Second, the model's geometric feature matrix is constructed using the distance from each three-dimensional mesh model vertex to the model c...
Optimum Signal Processing 2002–2007: SVD, PCA, KLT, CCA, and All That
1 Vector and Matrix Norms, 2 2 Subspaces, Bases, and Projections, 3 3 The Fundamental Theorem of Linear Algebra, 7 4 Solving Linear Equations, 7 5 The Singular Value Decomposition, 13 6 Moore-Penrose Pseudoinverse, 18 7 Least-Squares Problems and the SVD, 20 8 Condition Number, 22 9 Reduced-Rank Approximation, 23 10 Regularization of Ill-Conditioned Problems, 29 11 SVD and Signal Processing, 30...
Large Scale Canonical Correlation Analysis with Iterative Least Squares
Canonical Correlation Analysis (CCA) is a widely used statistical tool with both well-established theory and favorable performance for a wide range of machine learning problems. However, computing CCA for huge datasets can be very slow, since it involves computing a QR decomposition or singular value decomposition of huge matrices. In this paper we introduce L-CCA, an iterative algorithm which ...
Journal: Int. J. Machine Learning & Cybernetics
Volume 4, Issue
Pages -
Publication date: 2013